Gaussian Kullback-Leibler approximate inference
Authors
Abstract
We investigate Gaussian Kullback-Leibler (G-KL) variational approximate inference techniques for Bayesian generalised linear models and various extensions. In particular we make the following novel contributions: sufficient conditions for which the G-KL objective is differentiable and convex are described; constrained parameterisations of Gaussian covariance that make G-KL methods fast and scalable are provided; the lower bound to the normalisation constant provided by G-KL methods is proven to dominate those provided by local lower bounding methods; complexity and model applicability issues of G-KL versus other Gaussian approximate inference methods are discussed. Numerical results comparing G-KL and other deterministic Gaussian approximate inference methods are presented for: robust Gaussian process regression models with either Student-t or Laplace likelihoods, large scale Bayesian binary logistic regression models, and Bayesian sparse linear models for sequential experimental design.
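The G-KL bound the abstract refers to is simple to state: with a Gaussian approximation q(w) = N(m, S), one has log Z >= E_q[log p(w, D)] + H[q]. The sketch below illustrates this for Bayesian logistic regression with a unit Gaussian prior, evaluating each likelihood term by one-dimensional Gauss-Hermite quadrature; the function name and the Cholesky parameterisation S = L L^T are illustrative assumptions, not code from the paper:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss  # probabilists' Hermite rule

def gkl_bound(m, L, X, y):
    """Illustrative G-KL lower bound on log Z for Bayesian logistic
    regression with prior w ~ N(0, I) and q(w) = N(m, S), S = L L^T:
        log Z >= E_q[log p(y|X, w)] + E_q[log p(w)] + H[q].
    """
    d = len(m)
    S = L @ L.T
    # E_q[log N(w; 0, I)] = -0.5 * (d*log(2*pi) + m'm + tr(S))
    e_prior = -0.5 * (d * np.log(2 * np.pi) + m @ m + np.trace(S))
    # Entropy of q: 0.5*log|2*pi*e*S|, with 0.5*log|S| = sum(log(diag(L)))
    entropy = 0.5 * d * np.log(2 * np.pi * np.e) + np.sum(np.log(np.diag(L)))
    # Each likelihood site depends on the scalar a = x'w ~ N(x'm, x'Sx),
    # so E_q[log sigmoid(y*a)] reduces to a 1-D Gaussian expectation.
    nodes, weights = hermegauss(30)
    weights = weights / np.sqrt(2 * np.pi)   # normalise to N(0,1) weights
    mu = X @ m                               # per-site means
    s = np.sqrt(np.einsum('ij,jk,ik->i', X, S, X))  # per-site std devs
    a = mu[:, None] + s[:, None] * nodes[None, :]
    t = y[:, None] * a
    # numerically stable log sigmoid(t)
    log_sig = np.minimum(t, 0.0) - np.log1p(np.exp(-np.abs(t)))
    e_lik = np.sum(log_sig @ weights)
    return e_lik + e_prior + entropy
```

For any (m, L) this evaluates to a valid lower bound on log Z, which is what makes direct gradient-based maximisation over the variational parameters possible.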
Similar resources
Tilted Variational Bayes
We present a novel method for approximate inference. Using some of the constructs from expectation propagation (EP), we derive a lower bound of the marginal likelihood in a similar fashion to variational Bayes (VB). The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likel...
Concave Gaussian Variational Approximations for Inference in Large-Scale Bayesian Linear Models
Two popular approaches to forming bounds in approximate Bayesian inference are local variational methods and minimal Kullback-Leibler divergence methods. For a large class of models we explicitly relate the two approaches, showing that the local variational method is equivalent to a weakened form of Kullback-Leibler Gaussian approximation. This gives a strong motivation to develop efficient meth...
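A standard example of the local bounding approach mentioned in this abstract is the Jaakkola-Jordan quadratic lower bound on the logistic log-likelihood, log sigmoid(x) >= log sigmoid(xi) + (x - xi)/2 - lambda(xi) * (x^2 - xi^2) with lambda(xi) = tanh(xi/2) / (4*xi). A small illustrative sketch (not code from the paper):

```python
import math

def log_sigmoid(x):
    # numerically stable log(1 / (1 + exp(-x)))
    return -math.log1p(math.exp(-x)) if x >= 0 else x - math.log1p(math.exp(x))

def jj_lower_bound(x, xi):
    """Jaakkola-Jordan local quadratic lower bound on log sigmoid(x),
    tight at x = +/- xi; illustrative of the local variational bounds
    this abstract refers to."""
    lam = math.tanh(xi / 2.0) / (4.0 * xi)
    return log_sigmoid(xi) + (x - xi) / 2.0 - lam * (x * x - xi * xi)
```

Because the bound is quadratic in x, substituting it for each likelihood site keeps the integrals against a Gaussian prior analytic, which is the appeal of the local method that the paper relates to the G-KL approximation.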
Variational methods for approximate reasoning in graphical models
Exact inference in large and complex graphical models (e.g. Bayesian networks) is computationally intractable. Approximate schemes are therefore of great importance for real world computation. In this paper we consider a general scheme in which the original intractable graphical model is approximated by a model with a tractable structure. The approximating model is optimised by an iterative pro...
Expectation Propagation on the Maximum of Correlated Normal Variables
Many inference problems involving questions of optimality ask for the maximum or the minimum of a finite set of unknown quantities. This technical report derives the first two posterior moments of the maximum of two correlated Gaussian variables and the first two posterior moments of the two generating variables (corresponding to Gaussian approximations minimizing relative entropy). It is shown...
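The closed-form moments this abstract refers to trace back to Clark's (1961) formulas for the maximum of two correlated Gaussians. A sketch under that assumption (not the report's exact derivation; function names are illustrative):

```python
import math

def phi(x):
    """Standard normal pdf."""
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def Phi(x):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

def max_moments(mu1, s1, mu2, s2, rho):
    """Clark's formulas: mean and variance of max(x1, x2) where
    x1 ~ N(mu1, s1^2), x2 ~ N(mu2, s2^2) with correlation rho."""
    theta = math.sqrt(s1 * s1 + s2 * s2 - 2 * rho * s1 * s2)
    a = (mu1 - mu2) / theta
    m1 = mu1 * Phi(a) + mu2 * Phi(-a) + theta * phi(a)
    m2 = ((mu1 * mu1 + s1 * s1) * Phi(a)
          + (mu2 * mu2 + s2 * s2) * Phi(-a)
          + (mu1 + mu2) * theta * phi(a))
    return m1, m2 - m1 * m1  # mean, variance of the maximum
```

For two independent standard normals this gives E[max] = 1/sqrt(pi) and Var[max] = 1 - 1/pi, a convenient sanity check.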
Fitting Gaussian Markov Random Fields to Gaussian Fields
SUMMARY This paper discusses the following task often encountered when building Bayesian spatial models: construct a homogeneous Gaussian Markov random field (GMRF) on a lattice with correlation properties either as present in observed data or consistent with prior knowledge. The Markov property is essential in the design of computationally efficient Markov chain Monte Carlo algorithms used to analyse suc...
Journal: Journal of Machine Learning Research
Volume: 14
Pages: -
Year: 2013